NCGIA National Center for Geographic Information and Analysis Selected Annotated Bibliography On
Authors
Abstract
Generally, there is a lack of concepts expressing the accuracy of the digital data base of spatial information systems. Usually some point measures of positional accuracy, for example the root mean square positional error (RMS) or the circular error probable (CEP), are the only information available on geometrical accuracy. But the user is rather interested in measures for the accuracy of objects such as lines, parcels and polygons. The purpose of this paper is to propose a method of describing the accuracy of lines and polygons derived from the positional errors of points. The uncertainty of the digital representation of a line is often visualized by a band, an area defined by two parallels of the most probable position of the line. The constant width of this band is usually interpreted as the diameter of the error circles of all points of the line. This, however, is a simplification. By the law of error propagation it is easily demonstrated that points in the middle of the line possess smaller RMS values than the end points. Because of this, the correct error band is not a rectangle but a band bordered by sagging rather than straight lines. Further investigations have been carried out to determine the shape of the band at the end of a line and at corner and junction points of polygons.

Chrisman N R 1983 The Role of Quality Information in the Long-Term Functioning of a Geographic Information System. Proceedings of Auto Carto 6, 1: 303-312.
Abstract: A geographic information system requires a method to maintain its contents over the long term. This process must handle quality components along with the data directly depicted on a map. Quality information includes lineage records, accuracies of position and classification, integrity of data structure and temporal reference, among other things.
The quality component informs users of suitability for their applications, and it also offers distinct advantages to data producers with responsibility for long-term maintenance. Quality information is not currently maintained by most available software. New data structures and algorithms will be required to meet this need.

Chrisman N R 1984 The Role of Quality Information in the Long-Term Functioning of a Geographical Information System. Cartographica 21: 79-87.

Chrisman N R 1986 Obtaining Information on Quality of Digital Data. Proceedings of Auto Carto London, 1: 350-385.
Abstract: Quality information has been recognized as an important component of geographic information systems, at least in theory. This paper expands on the definitions of quality information with particular reference to sources. Many system developers believe that quality information will make enormous demands, and that the user may be unwilling to pay for such extravagance. This paper provides a counter-argument, maintaining that quality information is currently processed and maintained. Digital cartography usually proceeds on the presumption that the manual system is hopelessly outdated and encrusted with peculiar rites of dubious importance. This paper argues that quality information is a continual concern in the traditional practice of cartography. Examination of a few diverse cases reveals a broad range of quality information maintained by producers which can be fully integrated into current spatial databases. Better exploitation of quality information can improve communication with users, and it can also cut costs inside the producing agency.

Chrisman N R 1987 The Accuracy of Map Overlays: A Reassessment. Landscape and Urban Planning, 14: 427-439.

Chrisman N R 1989 Error in Categorical Maps: Testing versus Simulation. Proceedings of Auto Carto 9, 521-529.
Abstract: Understanding error in maps requires a combination of theory (new models) and practice (understanding how error can be measured in real applications). While other research emphasizes mathematical models to simulate error, a practical test provides a more useful judge of cartographic data quality. A comprehensive test, overlaying two categorical maps intended to be the same, can provide an estimate of separate components of error, including positional and attribute accuracy along with scale effects.

Chrisman N R 1989 Modeling Error in Overlaid Categorical Maps. In Accuracy of Spatial Databases. London: Taylor and Francis, 21-34.
Abstract: A model for error in geographic information should take high priority in the research agenda to support the recent explosion in the use of GIS. While the treatment of surfaces and spatial autocorrelation is more mathematically tractable, many GIS layers consist of categorical coverages, analyzed through polygon overlay. This paper provides a basic taxonomy of forms of error in this type of information and provides some questions for future research.

Chrisman N R 1991 The Error Component in Spatial Data. In Geographical Information Systems: Principles and Applications, D J Maguire, M F Goodchild and D W Rhind (Eds.), 1: 165-174. Essex: Longman Scientific and Technical.
Abstract: Although most data gathering disciplines treat error as an embarrassing issue to be expunged, the error inherent in spatial data deserves closer attention and public understanding.
This chapter presents a review of the components of error in spatial data, based on Sinton's scheme of roles in the measurement process and also on the categories in the US Proposed Standard for Digital Cartographic Data Quality.

Chrisman N R and Lester M 1991 A Diagnostic Test for Error in Categorical Maps. Proceedings of Auto-Carto 10: Technical Papers of the 1991 ACSM-ASPRS Annual Convention, 6: 330-348. Baltimore: ACSM-ASPRS.
Abstract: A test based on exhaustive overlay of two categorical maps provides a description of error distinguished into the likely sources of that error (a diagnosis of the error). The results of the overlay are characterized by geometric, topological and attribute criteria to separate the most likely positional errors from the attribute errors. This paper applies the proposed test to a simple land cover map, which was replicated by a second interpreter. Results diagnose the positional inaccuracy and misidentifications common in such a GIS layer. Adopting this test will target the efforts of a producer's quality control functions, and it will also clarify fitness for the particular uses contemplated by others.

Clarke A 1991 Data Quality Reporting in the Proposed Australian Spatial Data Transfer Standard. Proceedings of the Symposium on Spatial Database Accuracy, 1: 252-259. Melbourne: University of Melbourne.
Abstract: Standards Australia proposes to clone the U.S. Spatial Data Transfer Standard (SDTS) to supersede the current Australian spatial data standard, AS 2482. SDTS adopts a 'truth in labeling' approach to data quality, requiring users to report what is known about the lineage, positional accuracy, attribute accuracy, logical consistency and completeness of the data being transferred. The AS 2482 and SDTS data quality reporting requirements are described, and the implications for data producers are discussed.
It is proposed that spatial data users critically review the quality of the SDTS quality reports supplied by data producers, and that they specify their data quality requirements within the SDTS report structure.

Couclelis H, Beard M K and Mackaness W A 1992 Two Perspectives on Data Quality. NCGIA Technical Report 92-12, Dec., NCGIA, University of California, Santa Barbara. The first report discusses the impediments to effective quality control, and proposes a conceptual model to monitor GIS product quality at any stage of deriving an application; the second outlines a research agenda based on the identification of impediments to data quality.

Coward P and Heywood I 1991 Aspects of Uncertainty in Spatial Decision Making. Proceedings of EGIS '91, 1: 233-242. Brussels: EGIS Foundation.

Coward P and Heywood I 1992 Multiple Criteria Decision Analysis as a Method for Quality Assurance in GIS. Proceedings of The Canadian Conference on GIS, 1: 643-653. Ottawa.
Abstract: Quality assurance is an area of applied GIS which is growing in importance. At present much of the current research involves the use of complex statistical techniques which require large amounts of processing power and are difficult for potential users to comprehend. This paper outlines research underway at the University of Salford towards developing a hierarchical structure for assigning significance and weights to error sources in spatial information. This approach forms the basic building blocks for a user-centered approach to quality assurance. The techniques employed draw upon Multiple Criteria Decision Analysis (MCDA) as a working methodology. The model described here is implemented using procedures which already exist in the SPANS GIS.

Deichmann U, Anselin L and Goodchild M F 1992 Estimation Issues in the Areal Interpolation of Socioeconomic Data. Proceedings of EGIS '92, 254-263.
Abstract: Spatial data are collected and represented as attributes of spatial objects embedded in the plane.
We define basis change as the transfer of attributes from one set of objects to another. Methods of basis change for socioeconomic data are reviewed, and are seen to differ in the assumptions they make about underlying density surfaces. We extend these methods to more general cases, and provide an illustration using California data. The implementation of this framework within a GIS is discussed.

Detrekoi A 1994 Data Quality Management in GIS Systems. Computers, Environment and Urban Systems, Mar-Apr 1994, 18(2): 81-85.

Dillard R 1992 Using Data Quality Measures in Decision-Making Algorithms. IEEE Expert, Dec 1992, 7(6): 63-72.

Dunn R, Harrison A R and White J C 1990 Positional Accuracy and Measurement Error in Digital Databases of Land Use: An Empirical Study. International Journal of Geographical Information Systems, 4(4): 385-398.
Abstract: This paper discusses the issues of positional accuracy and measurement error in the context of a large empirical study of landscape change in England and Wales. The epsilon band model of digitizing accuracy is used to make estimates of the levels of positional uncertainty and measurement error due to digitizing polygon outlines. The degree of error expected had the same polygons been captured in raster format is then determined. These results prompt a general discussion of the nature of error in spatial databases.

Dutton G 1992 Handling Positional Uncertainty in Spatial Databases. In P. Bresnahan, E. Corwin and D. Cowen (Eds.), Proceedings of the 5th International Symposium on Spatial Data Handling, 2: 460-469. Charleston: IGU Commission on GIS.
Abstract: Many inaccuracies in digitized map features derive from circumstances of data capture, others from the ways in which features are represented in a database, and others stem from how data are manipulated, particularly when datasets are merged or graphic scales change.
This paper describes one way in which vector representation of digitized map features limits their reliability and, by implication, the quality and usability of databases of digitized maps. It describes how locational uncertainty about punctiform, linear and areal cartographic features derives from mislocation of the coordinates that define them, regardless of what causes these mislocations in the first place (hardware error or operator blunders during digitizing, line widths on source maps, numerical instability in computations), and goes on to specify an approach to handling this inevitable source of error in a systematic way.

Dutton G and Buttenfield B P 1993 Scale Change Via Hierarchical Coarsening: Cartographic Properties of Quaternary Triangular Meshes. Proceedings of the 16th Conference of the International Cartographic Association, 3-8 May 1993, Cologne, Germany, 2: 847-862.
Abstract: A data model and coordinate system for global spatial data is described: called the quaternary triangular mesh (QTM), this model identifies geographic locations as a nested hierarchy of triangles created by subdividing the faces of an octahedron embedded in a planet. At each successive level of subdivision, each facet begets four children, each blossoming forth to occupy slightly more than one-fourth of its parent's area. After 16 levels of subdivision, facets are roughly the size of Landsat pixels. By the 30th level, the planet is covered by millions of trillions of facets, each about one square centimeter in area. Once the coordinates of punctiform, linear and polygonal digital cartographic features are encoded into hierarchical addresses, they may be retrieved at any level of detail (doublings of scale), as desired. QTM addresses can be retransformed into geographic coordinates by selecting centers of triangular facets or other locations within them (no unique mapping from QTM to latitude and longitude exists).
The paper concludes with a discussion of cartographic problems that QTM encoding of map features can address, perhaps even solve.

Edwards G 1992 The Integration of Remotely Sensed Data Analysis into GIS: Time and Uncertainty Management Needs. Proceedings of The Canadian Conference on GIS, 1: 432-440. Ottawa.
Abstract: The integration of remote sensing image analysis with GIS technology requires, among other issues, the development of automated and efficient time-management capabilities in GIS software. Remotely sensed images are obtained at intervals ranging from a few minutes to many years, and hence the time management of these data is often more exacting than that of typical cartographic data. Furthermore, effective means of handling uncertainty must be introduced if the results of temporal analyses are to be meaningful. These issues are discussed in some detail, along with research being carried out to put such a capability together.

Eginton D W 1992 Achieving Consensus on GIS Map Accuracy. In D. M. Freund (Ed.), URISA 1992 Annual Conference Proceedings, 1: 12-23. Washington, D.C.: Urban and Regional Information Systems Association.
Abstract: Coordination among a multiplicity of local government agencies and public service providers is often critical to the success of a GIS program. Failure to carefully consider accuracy standards can limit the usefulness of the GIS, preclude efficient sharing of data, limit options for cost sharing, and result in costly duplication of efforts. Geodetic and survey control, land parcel and street centerline data are strategic components of a GIS database, since they provide a basis for registration or creation of other sets of thematic data.
Plans for development of these three components of the database should be coordinated and should address several key issues, including: future GIS application requirements, database design, sources of data, alternative approaches to the establishment of control coordinates, potential uses of aerial photogrammetry, and alternative parcel conversion procedures.

Elmes G A 1992 Data Quality Issues in User Interface Design for a Knowledge-Based Decision Support System. In P. Bresnahan, E. Corwin and D. Cowen (Eds.), Proceedings of the 5th International Symposium on Spatial Data Handling, 1: 303-312. Charleston: IGU Commission on GIS.
Abstract: An 'intelligent' geographic information system (IGIS) is being developed as part of a spatial decision support expert system to advise on the management of gypsy moth, a serious pest of deciduous forests in North America. The gypsy moth expert system (GypsES) has three primary knowledge-based components: hazard rating and risk assessment; insect monitoring and prediction; and treatment. Each component is embedded in a GIS designed to have certain aspects of intelligent functionality. The objectives of the development of IGIS for GypsES are to: define a conceptual framework of intelligent functions for spatial data handling; identify the nature of information provided by IGIS components; identify the sources, nature and content of knowledge necessary to develop intelligent functionality; and evaluate IGIS under GypsES test conditions.

Fain M A 1991 Quality Through Conversion Process Management. Proceedings of GIS/LIS '91, 1: 137-141. Atlanta.
Abstract: The quality of the data destined for geographic information systems continues to be a topic of debate and an area that requires serious scrutiny. Quality is first defined in a measurable manner and the costs of poor quality data are discussed. The way to achieve data quality is to view conversion as a process.
Process management makes a project-by-project approach obsolete and is used to implement a "zero defect" philosophy. High quality throughout the process then becomes less expensive than a final quality review phase that requires re-routing and re-working after the fact. The process itself ends up assuming a great deal of the responsibility for the quality of the end product, in effect eliminating opportunities for human error.

Fenstermaker L K 1991 A Proposed Approach for National to Global Scale Error Assessments. Proceedings of GIS/LIS '91, 1: 293-300. Atlanta.
Abstract: The error assessment of large-area classified imagery presents unique problems to the remote sensing scientist. The primary problem is usually a lack of resources to collect sufficient ground information. This paper proposes a field verification and accuracy assessment approach that uses field, aerial and existing data from a representative area. The proposed approach has three phases. The first phase partitions the large area into relatively homogeneous strata; for example, the United States may be partitioned into ecoregion types. The second phase selects, within those strata, representative areas that contain most of the classes of interest. The third phase is stratified random sampling by class, ensuring at least 50 points per class or 100 percent representation of small-area resources.

Firns P and Benwell G 1991 ER on the Side of Spatial Accuracy. Proceedings of the Symposium on Spatial Database Accuracy, 1: 192-202. Melbourne: University of Melbourne.
Abstract: There are two main issues pertaining to the accuracy of spatial databases. They are defined here as: (a) spatial accuracy, the ability of the database to accurately represent the position of objects in space and relative to each other; (b) descriptive accuracy, the ability of the database to accurately represent the state of objects in terms of attribute values and non-spatial relationships between each other.
This paper is concerned with descriptive accuracy and specifically examines the implications for descriptive accuracy of inappropriate database structures. The entity-relationship (ER) model is used as a framework for analyzing the appropriateness of alternative database structures for a specific application, and it is shown that an inappropriate database structure as specified by an ER model can preclude the production of accurate information.

Fisher P F 1993 Conveying Object-Based Meta-Information. Proceedings of Auto Carto 11, Minneapolis, Minnesota, Oct 30 - Nov 1, 113-122.
Abstract: Metadata and lineage are two related areas of research which have received considerable recent attention from the GIS community, but that attention has largely focused on meta-information at the level of the image, coverage, or data layer. Researchers working in other areas, such as error and uncertainty handling, have focused on a lower level within the spatial data, but can also be considered to have been working on metadata. Users require access to the whole set of metadata, from the level of the mapset to the elemental object, i.e. the point, line, polygon or pixel. This paper attempts to draw these different research strands together and suggests an umbrella framework for metadata which can be translated to a number of different flavors of metadata, for which accuracy, lineage, statistics and visualization are exemplified. This leads to discussion of an interface design which enables rapid access to the metadata.

Fisher P F 1991 First Experiments in Viewshed Uncertainty: The Accuracy of the Viewshed Area. Photogrammetric Engineering and Remote Sensing, LVII(10): 1321-1328.
Abstract: Digital elevation models are computer representations of a portion of the land surface. The elevations recorded in the DEM are not, however, without error, and the United States Geological Survey publishes a root mean squared error for each DEM.
The research reported here examines how that error propagates into derivative products resulting from geographic information system type operations. One product from a DEM, the focus of this paper, is the viewshed: the area observable from a viewing location, as opposed to that which is invisible. In this research, repeated error fields with varying parameters are added to the original DEM, and the viewshed is determined in the resulting noisy DEM. Results show that the area of the viewshed calculated in the original DEM may significantly overestimate the viewshed area.

Fisher P F 1987 The Nature of Soil Data in GIS: Error or Uncertainty. Proceedings of the International Geographic Information Systems (IGIS) Symposium: The Research Agenda, 3: 307-318.

Fisher P F 1991 Simulation of the Uncertainty of a Viewshed. In Auto-Carto 10: Technical Papers of the 1991 ACSM-ASPRS Annual Convention, 6: 205-218. Baltimore: ACSM-ASPRS.
Abstract: One of the most widely available procedures packaged with GIS for the analysis of a Digital Elevation Model (DEM) is the identification of the viewable area, or viewshed. The elevations recorded in the DEM do, however, contain error, and the USGS, for example, publishes a Root Mean Squared Error (RMSE) for each DEM. Research reported here assesses the uncertainty of locations being within a viewshed, given the published error for the DEM. In this research, repeated error fields are simulated with variable spatial autocorrelation and added to the original DEM. The viewshed is then determined in the resulting noisy DEM. Results show that, under the basic assumption of spatial independence in the error which is implicit in the RMSE, remarkably few points are reliably within the viewshed. With spatially autocorrelated noise the reliability is higher, but it should still be cause for concern to many using viewshed procedures.
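The Monte Carlo procedure summarized in the two Fisher viewshed abstracts above can be sketched in a few lines. The following is a minimal, hypothetical illustration only (a 1-D elevation profile, a toy line-of-sight test, and spatially independent Gaussian noise, the simplest of the error models Fisher examines), not the published implementation; all names are invented for the sketch:

```python
import math
import random

def visible(profile, viewer_idx, viewer_h=2.0):
    """Line-of-sight visibility along a 1-D elevation profile: a cell is
    visible if its sightline slope from the viewer exceeds that of all
    intervening terrain."""
    vis = [False] * len(profile)
    eye = profile[viewer_idx] + viewer_h
    max_slope = -math.inf
    for i in range(viewer_idx + 1, len(profile)):
        slope = (profile[i] - eye) / (i - viewer_idx)
        if slope >= max_slope:
            vis[i] = True
            max_slope = slope
    return vis

def viewshed_reliability(profile, viewer_idx, rmse, n_runs=200, seed=1):
    """Fisher-style simulation: perturb the DEM with zero-mean noise of
    the published RMSE, recompute the viewshed each time, and report the
    per-cell frequency of being visible."""
    rng = random.Random(seed)
    counts = [0] * len(profile)
    for _ in range(n_runs):
        noisy = [z + rng.gauss(0.0, rmse) for z in profile]
        for i, v in enumerate(visible(noisy, viewer_idx)):
            counts[i] += v
    return [c / n_runs for c in counts]

profile = [10, 12, 15, 14, 13, 18, 16, 15, 17, 20]  # toy elevations
crisp = visible(profile, 0)
rel = viewshed_reliability(profile, viewer_idx=0, rmse=1.5)
# Cells "visible" in the crisp viewshed are often far from certain
# once DEM error is simulated:
for i in range(1, len(profile)):
    print(i, crisp[i], round(rel[i], 2))
```

Adding spatially autocorrelated noise fields, as the second abstract describes, would replace the independent `rng.gauss` draws with a correlated random field and generally raises the per-cell reliabilities.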
Foote-Lennox T 1992 The Need for Coordinate Quality Metrics in Predicting Accuracy in GIS Queries. Proceedings of the 13th Annual ESRI User Conference, 121-127.
Abstract: A coordinate pair defines the center of some object (for example, a ZIP code, fire district, or address) in a GIS database. The coordinates alone provide little information that can be used to determine the quality of placement. The user must assume that the accuracy is the same for each coordinate and trust that the accuracy is sufficient. This paper calls for a new standard in which additional metrics would be included with each coordinate pair. These additional metrics allow the measurement of positional error and, in the context of a specific geographic query, allow a confidence metric to be calculated.

Fox C, Levitin A and Redman T 1994 The Notion of Data and Its Quality Dimensions. Information Processing and Management, Jan-Feb 1994, 30(1): 9-19.

Ganter J H 1993 Metadata Management in an Environmental GIS for Multidisciplinary Users. Proceedings of GIS/LIS '93, Nov 2-4, 1993, Minneapolis, Minnesota, 1: 233-245.
Abstract: A GIS interface to support non-specialists must collect and maintain metadata describing iterative work sessions and interim or final products such as maps and analyses. These capabilities, which are not yet part of commercial software, allow the user to confidently navigate underlying databases and quickly return to previous maps and analyses with minimal effort. The same interface would assist organizations that manage growing collections of derivative products (e.g., maps with source data, views, layouts, legends, etc., which require storage and future retrievability). Metadata is a significant topic in the GIS literature because it is linked to data interchange, compatibility, accuracy, and quality. There has been little emphasis on tracking derivative datasets, analyses, and dynamic map compositions once data is within an organization.
A prototype interface addressing these needs was developed at the Los Alamos National Laboratory (LANL) environmental restoration program. The interface uses a metaphor wherein the user can manage a collection of Snapshots that, with outward simplicity, describe data themes, storage locations, parameters, and map layout.

Garza R J and Foresman T W 1991 Embedding Quality Into County-Wide Data Conversion. Proceedings of GIS/LIS '91, 1: 130-136. Atlanta.
Introduction: Through careful planning and attention to detailed process control design, efficient and automated database construction can be performed with embedded quality control for a geographic information system. Establishing the appropriate quality control for data conversion efforts requires formalization of data handling procedures, conversion techniques, documentation standards, and product specifications. While planning, implementation, and enforcement of these methodologies are major components of a quality assurance approach, group commitment and a vigilant monitoring effort are required to fulfill the ultimate strategic goals prescribed.

Gong P and Chen J 1992 Boundary Uncertainties in Digitized Maps I: Some Possible Determination Methods. Proceedings of GIS/LIS '92, 1: 274-281.
Abstract: In traditional thematic mapping, errors or uncertainties are not associated with the final map product. When such maps are used in a map compilation or a map overlay process to generate derivative maps, the lack of error or uncertainty estimates makes it difficult to study how errors or uncertainties are propagated. In this paper, we report an ongoing research project regarding methods on how to determine, represent, and display boundary uncertainties in categorical (area-class) maps. Selective sampling, curve fitting and blending functions can be used for determining boundary uncertainties and their distributions.
An experiment was designed to illustrate the different levels of boundary uncertainties and uncertainty determination methods.

Gonzalez M 1994 Improving Data Quality Awareness in the United States Federal Statistical Agencies. American Statistician, 1994
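Several of the entries above rest on the law of error propagation, most directly the opening abstract's claim that the error band around a digitized line sags in the middle. That claim can be checked numerically with a minimal sketch (assuming independent, equal, isotropic errors at the two measured end points; the function name is invented for illustration):

```python
import math

def interpolated_rms(sigma_a, sigma_b, t):
    """RMS positional error of the point P(t) = (1-t)*A + t*B on a
    digitized segment, by the law of error propagation under
    independent errors at the two measured end points A and B."""
    return math.sqrt(((1 - t) * sigma_a) ** 2 + (t * sigma_b) ** 2)

sigma = 1.0  # end-point RMS in map units
for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(t, round(interpolated_rms(sigma, sigma, t), 3))
# The midpoint RMS is sigma / sqrt(2), roughly 0.707 * sigma, smaller
# than the end-point RMS -- so the error band is narrower in the middle
# than at the ends, sagging rather than forming a constant-width
# rectangle, as the opening abstract argues.
```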
Similar Articles
NCGIA National Center for Geographic Information and Analysis A Review of Spatial Population
NCGIA National Center for Geographic Information and Analysis
Environmental Equity in Los Angeles
Temporal Relations in Geographic Information
A workshop on temporal relations in Geographic Information Systems (GIS) was held on October 12-13, 1990, at the University of Maine. The meeting was sponsored by the National Center for Geographic Information and Analysis (NCGIA). Seventeen specialists gathered from the fields of Geography, GIS, and Computer Science to discuss users' requirements of temporal GIS and to identify the research is...
National Center for Geographic Information and Analysis Visual Interfaces to Geometry
This report presents the results of a two-day workshop on "Visual Interfaces to Geometry" which was conducted by Werner Kuhn and Max Egenhofer at ACM's CHI'90 Conference on Human Factors in Computing Systems. From the perspective of the National Center for Geographic Information and Analysis (NCGIA), this workshop provided an opportunity to share and discuss results of NCGIA's Research Initiati...
A Research Agenda for Geographic Information and Analysis
Preface In our original NCGIA proposal to the National Science Foundation (NSF) in 1988, we presented a two-part research plan that responded directly to the solicitation document issued by NSF in late 1987. The first part (subsequently published as NCGIA, 1989) was a discussion of impediments to the successful use of GIS in geographic analysis, together with the research necessary to overcome ...
Categorizing Binary Topological Relations Between Regions, Lines, and Points in Geographic Databases
This research was partially funded by NSF grant No. IRI-9309230 and grants from Intergraph Corporation. Additional support from NSF for the NCGIA under No. SBR-9204141 is gratefully acknowledged. Max J. Egenhofer University of Maine, National Center for Geographic Information and Analysis and Department of Surveying Engineering, Department of Computer Science, University of Maine, Orono, ME 044...